Efficient Transformers: A Survey

Authors

Abstract

Transformer model architectures have garnered immense interest lately due to their effectiveness across a range of domains like language, vision, and reinforcement learning. In the field of natural language processing, for example, Transformers have become an indispensable staple in the modern deep learning stack. Recently, a dizzying number of "X-former" models have been proposed (Reformer, Linformer, Performer, Longformer, to name a few) which improve upon the original Transformer architecture, many of which make improvements around computational and memory efficiency. With the aim of helping the avid researcher navigate this flurry, this article characterizes a large and thoughtful selection of recent efficiency-flavored "X-former" models, providing an organized and comprehensive overview of existing work across multiple domains.
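The efficiency concern the abstract refers to is the quadratic cost of self-attention: standard attention materializes an n×n score matrix, while many "X-former" variants avoid it. The sketch below contrasts the two in numpy. It is illustrative only: the function names are mine, and the positive feature map `phi` is a simple stand-in for the kernel feature maps used by linearized-attention models, not the specific construction of any surveyed paper.

```python
import numpy as np

def softmax_attention(Q, K, V):
    # Standard attention: builds an (n, n) score matrix -> O(n^2) time and memory.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))  # stable softmax
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def linear_attention(Q, K, V, phi=lambda x: np.maximum(x, 0.0) + 1e-6):
    # Kernelized attention: reassociating as phi(Q) @ (phi(K)^T V) never forms
    # the (n, n) matrix, so cost is linear in sequence length n.
    Qp, Kp = phi(Q), phi(K)          # positive features, shapes (n, d)
    KV = Kp.T @ V                    # (d, d_v) summary, independent of n x n
    Z = Qp @ Kp.sum(axis=0)          # (n,) per-row normalizer
    return (Qp @ KV) / Z[:, None]

rng = np.random.default_rng(0)
n, d = 8, 4
Q, K, V = rng.normal(size=(3, n, d))
print(softmax_attention(Q, K, V).shape)  # (8, 4)
print(linear_attention(Q, K, V).shape)   # (8, 4)
```

Both functions return one output vector per query; the linearized version trades the exact softmax weighting for a kernel approximation in exchange for the better asymptotic cost.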


Similar articles

Efficient Runtime Representation of Evaluation Transformers

This article outlines an efficient runtime representation of evaluation transformers. An evaluator space containing parameterised evaluators for structured and polymorphic types is introduced. Using this framework it is possible to handle runtime information about evaluators for arbitrary structured types as well as evaluation transformers for functions over polymorphic types. In future develop...


A Structural Survey of the Polish Posters

Illustration possesses abundant capabilities.


Load Shedding – An Efficient Use of LTC Transformers

This paper focuses on a methodology for load shedding in electric power systems that minimizes its impact on the demand level. The power demand is parameterized by the factor α, which is minimized subject to equality constraints (real and reactive power mismatches) and inequality constraints (operational and equipment limits). Lower and upper limits on voltage magnitudes an...


Monad Transformers as Monoid Transformers

The incremental approach to modular monadic semantics constructs complex monads by using monad transformers to add computational features to a pre-existing monad. A complication of this approach is that the operations associated with the pre-existing monad need to be lifted to the new monad. In a companion paper by Jaskelioff, the lifting problem has been addressed in the setting of system Fω. Her...


Digital FIR Hilbert Transformers: Fundamentals and Efficient Design Methods

© 2012 Troncoso Romero and Jovanovic Dolecek, licensee InTech. This is an open access chapter distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.



Journal

Journal title: ACM Computing Surveys

Year: 2022

ISSN: 0360-0300, 1557-7341

DOI: https://doi.org/10.1145/3530811